
Logical Parsing from Natural Language Based on a Neural Translation Model



Abstract

Semantic parsing has emerged as a significant and powerful paradigm for natural language interfaces and question answering systems. Traditional methods of building a semantic parser rely on high-quality lexicons, hand-crafted grammars and linguistic features, which are limited by the applied domain or representation. In this paper, we propose a general approach to learning from denotations based on a Seq2Seq model augmented with an attention mechanism. We encode the input sequence into vectors and use dynamic programming to infer candidate logical forms. We exploit the fact that similar utterances should have similar logical forms to help reduce the search space. Under our learning policy, the Seq2Seq model can learn the mappings gradually in the presence of noise. Curriculum learning is adopted to make the learning smoother. We test our method on the arithmetic domain and show that our model can successfully infer the correct logical forms and learn word meanings, compositionality and operation order simultaneously.
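The abstract's core idea of inferring candidate logical forms with dynamic programming, then keeping those consistent with the observed denotation, can be sketched on the arithmetic domain. The snippet below is an illustrative toy (the grammar, function names and the CYK-style chart are assumptions, not the authors' implementation): it enumerates all binary logical forms over the numbers in an utterance and filters them by denotation, which also exposes the spurious-form ambiguity the learning policy must resolve.

```python
from itertools import product

# Hypothetical toy grammar for the arithmetic domain: binary operators
# composed over the numbers of an utterance, in order.
OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
}

def candidate_forms(nums):
    """Enumerate all binary-tree logical forms over `nums` via dynamic
    programming: chart[(i, j)] holds (logical_form, value) parses of
    the span nums[i:j], built bottom-up CYK-style."""
    n = len(nums)
    chart = {(i, i + 1): [(str(x), x)] for i, x in enumerate(nums)}
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            cell = []
            for k in range(i + 1, j):  # split point between sub-spans
                pairs = product(chart[(i, k)], chart[(k, j)])
                for (lf_l, v_l), (lf_r, v_r) in pairs:
                    for op, fn in OPS.items():
                        cell.append((f"({op} {lf_l} {lf_r})", fn(v_l, v_r)))
            chart[(i, j)] = cell
    return chart[(0, n)]

def consistent_forms(nums, denotation):
    """Candidate logical forms whose value matches the observed answer."""
    return sorted(lf for lf, v in candidate_forms(nums) if v == denotation)

# An utterance like "two plus three times four" with answer 14: two
# spurious-free candidates survive, so denotation alone is ambiguous.
print(consistent_forms([2, 3, 4], 14))
# → ['(* 2 (+ 3 4))', '(+ 2 (* 3 4))']
```

Both surviving forms denote 14, which illustrates why the paper leans on similar utterances sharing similar logical forms: denotations alone cannot always single out the correct parse.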
